23 research outputs found

    Scope Graphs: The Story so Far

    Static name binding (i.e., associating references with appropriate declarations) is an essential aspect of programming languages. However, it is usually treated in an unprincipled manner, often leaving a gap between formalization and implementation. The scope graph formalism mitigates these deficiencies by providing a well-defined, first-class, language-parametric representation of name binding. Scope graphs serve as a foundation for deriving type checkers from declarative type system specifications, reasoning about type soundness, and implementing editor services and refactorings. In this paper we present an overview of scope graphs, and, using examples, show how the ideas and notation of the formalism have evolved. We also briefly discuss follow-up research beyond type checking, and evaluate the formalism.
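
    As a rough illustration only (not the paper's own notation, and ignoring the visibility and shadowing policies the formalism supports), a scope graph can be pictured as scopes carrying declarations and labelled edges to other scopes, with name resolution as a search for a declaration reachable from the scope of a reference. The Haskell sketch below is a minimal version under those assumptions; every name in it is invented for illustration.

```haskell
-- Minimal, illustrative sketch of a scope graph and name resolution; it
-- ignores the visibility/ordering policies that the real formalism provides.
import qualified Data.Map as Map
import Data.Map (Map)

type Name  = String
type Scope = Int
data Label = Lexical | Import deriving (Eq, Show)

data ScopeGraph = ScopeGraph
  { decls :: Map Scope [Name]            -- declarations per scope
  , edges :: Map Scope [(Label, Scope)]  -- labelled edges between scopes
  }

-- Resolve a reference: search for a scope with a matching declaration that is
-- reachable from the scope in which the reference occurs.
resolve :: ScopeGraph -> Scope -> Name -> [Scope]
resolve g s0 x = go [] s0
  where
    go seen sc
      | sc `elem` seen = []                                  -- guard against cycles
      | x `elem` Map.findWithDefault [] sc (decls g) = [sc]  -- found a declaration
      | otherwise =
          concat [ go (sc : seen) sc'
                 | (_, sc') <- Map.findWithDefault [] sc (edges g) ]

-- Example: scope 1 (a block) has a lexical edge to scope 0, which declares "x".
example :: ScopeGraph
example = ScopeGraph (Map.fromList [(0, ["x"])])
                     (Map.fromList [(1, [(Lexical, 0)])])

main :: IO ()
main = print (resolve example 1 "x")   -- prints [0]
```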

    Dependently Typed Languages in Statix

    Static type systems can greatly enhance the quality of programs, but implementing a type checker that is both expressive and user-friendly is challenging and error-prone. The Statix meta-language (part of the Spoofax language workbench) aims to make this task easier by automatically deriving a type checker from a declarative specification of a type system. However, so far Statix has not been used to implement dependent types, a class of type systems that requires evaluation of terms during type checking. In this paper, we present an implementation of a simple dependently typed language in Statix, and discuss how to extend it with several common features such as inductive data types, universes, and inference of implicit arguments. While we encountered some challenges in the implementation, our conclusion is that Statix is already usable as a tool for implementing dependent types.
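
    A dependent type checker must evaluate terms because types may contain them: whether, say, a vector type indexed by 1 + 1 equals one indexed by 2 can only be decided after evaluation. The Haskell sketch below is a tiny, self-contained illustration of such a conversion check; it is not the Statix implementation from the paper, and the Vec example is invented.

```haskell
-- Tiny illustration (not the paper's Statix code) of why dependent type
-- checking requires evaluation: two types are considered equal only up to
-- evaluation of the terms embedded in them.
data Term
  = Lit Int
  | Add Term Term
  | Vec Term            -- a type indexed by a term, e.g. Vec (1 + 1)
  deriving (Eq, Show)

-- Evaluate the term fragments occurring inside types.
eval :: Term -> Term
eval (Add a b) =
  case (eval a, eval b) of
    (Lit m, Lit n) -> Lit (m + n)
    (a', b')       -> Add a' b'
eval (Vec n) = Vec (eval n)
eval t       = t

-- Conversion check: syntactic equality after evaluation.
convertible :: Term -> Term -> Bool
convertible s t = eval s == eval t

main :: IO ()
main = print (convertible (Vec (Add (Lit 1) (Lit 1))) (Vec (Lit 2)))  -- True
```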

    The relationship between sensory processing sensitivity and attention deficit hyperactivity disorder traits: a spectrum approach

    The aim of the present study was to examine the relationship between sensory processing sensitivity (SPS) and symptoms of Attention Deficit Hyperactivity Disorder (ADHD) in adults. The Highly Sensitive Person Scale (HSPS) and the Adult ADHD Self-Report Scale (ASRS) were administered to a non-clinical group of 274 participants recruited from a university volunteer list. We found a highly significant positive correlation between the number of self-reported ADHD traits and sensory sensitivity. Furthermore, ADHD traits and age were predictors of SPS, and exploratory factor analysis revealed a factor that combined ADHD traits and items from the HSPS. The psychometric properties of the HSPS were also examined, supporting the unidimensional nature of the concept. To our knowledge, this is the first study to identify a positive relationship between HSPS scores and ADHD traits in the general population. Our results further support recent findings suggesting abnormal sensory processing in ADHD.

    Antiinflammatory Therapy with Canakinumab for Atherosclerotic Disease

    Background: Experimental and clinical data suggest that reducing inflammation without affecting lipid levels may reduce the risk of cardiovascular disease. Yet, the inflammatory hypothesis of atherothrombosis has remained unproved. Methods: We conducted a randomized, double-blind trial of canakinumab, a therapeutic monoclonal antibody targeting interleukin-1β, involving 10,061 patients with previous myocardial infarction and a high-sensitivity C-reactive protein level of 2 mg or more per liter. The trial compared three doses of canakinumab (50 mg, 150 mg, and 300 mg, administered subcutaneously every 3 months) with placebo. The primary efficacy end point was nonfatal myocardial infarction, nonfatal stroke, or cardiovascular death. Results: At 48 months, the median reduction from baseline in the high-sensitivity C-reactive protein level was 26 percentage points greater in the group that received the 50-mg dose of canakinumab, 37 percentage points greater in the 150-mg group, and 41 percentage points greater in the 300-mg group than in the placebo group. Canakinumab did not reduce lipid levels from baseline. At a median follow-up of 3.7 years, the incidence rate for the primary end point was 4.50 events per 100 person-years in the placebo group, 4.11 events per 100 person-years in the 50-mg group, 3.86 events per 100 person-years in the 150-mg group, and 3.90 events per 100 person-years in the 300-mg group. The hazard ratios as compared with placebo were as follows: in the 50-mg group, 0.93 (95% confidence interval [CI], 0.80 to 1.07; P = 0.30); in the 150-mg group, 0.85 (95% CI, 0.74 to 0.98; P = 0.021); and in the 300-mg group, 0.86 (95% CI, 0.75 to 0.99; P = 0.031). The 150-mg dose, but not the other doses, met the prespecified multiplicity-adjusted threshold for statistical significance for the primary end point and the secondary end point that additionally included hospitalization for unstable angina that led to urgent revascularization (hazard ratio vs. placebo, 0.83; 95% CI, 0.73 to 0.95; P = 0.005). Canakinumab was associated with a higher incidence of fatal infection than was placebo. There was no significant difference in all-cause mortality (hazard ratio for all canakinumab doses vs. placebo, 0.94; 95% CI, 0.83 to 1.06; P = 0.31). Conclusions: Antiinflammatory therapy targeting the interleukin-1β innate immunity pathway with canakinumab at a dose of 150 mg every 3 months led to a significantly lower rate of recurrent cardiovascular events than placebo, independent of lipid-level lowering. (Funded by Novartis; CANTOS ClinicalTrials.gov number, NCT01327846.)

    Composable Type System Specification using Heterogeneous Scope Graphs

    Static analysis is of indispensable value for the robustness of software systems and the efficiency of developers. Moreover, many modern-day software systems are composed of interacting subsystems written in different programming languages. However, in most cases no static validation of these interactions is applied. In this thesis, we identify the Cross-Language Static Semantics Problem, which is defined as "How to provide a formal and executable specification of the static semantics of interactions between parts of a software system written in different languages?" We investigate current solutions to this problem, and propose criteria to which an all-encompassing solution to this problem must adhere. After that, we present a design pattern for the Statix meta-DSL for static semantics specification that makes it possible to model loosely coupled, composable type system specifications. This pattern entails that the semantic concepts of a particular domain are encoded in an interface specification library, which is integrated in the type systems of concrete languages. This allows controlled but automated composition of type systems. We show that, under some well-formedness criteria, the system provides correct results. A runtime that executes composed specifications is implemented using PIE pipelines for partial incrementality, and integrated in the command-line interface and Eclipse IDE platforms using the Spoofax 3 Framework. This allows using multi-language analysis in concrete projects. The design pattern and the accompanying runtime are validated using two case studies. These case studies show that the approach is effective, even in a case where there is an impedance mismatch between the data models of the involved languages.
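
    The pattern itself is expressed in Statix; purely as a hypothetical analogy (all names invented, no relation to the actual artifact), the Haskell sketch below shows the general shape of the idea: a shared interface of queries that each language-specific checker implements, so that cross-language references can be checked without the component checkers knowing about each other.

```haskell
-- Hypothetical analogy only (not Statix, not the thesis artifact): an
-- "interface specification" as a record of queries that each language-specific
-- checker implements, so cross-language lookups go through the shared interface.
import qualified Data.Map as Map

type Name = String
data Ty = TyInt | TyString deriving (Eq, Show)

-- The shared interface: what one language may ask about entities of another.
newtype DomainInterface = DomainInterface
  { lookupExported :: Name -> Maybe Ty }

-- Two toy "languages", each exposing its exported declarations.
langA :: DomainInterface
langA = DomainInterface (`Map.lookup` Map.fromList [("port", TyInt)])

langB :: DomainInterface
langB = DomainInterface (`Map.lookup` Map.fromList [("host", TyString)])

-- A composed analysis: check a cross-language reference against whichever
-- component exports it, without the components referring to each other.
checkCrossRef :: [DomainInterface] -> Name -> Ty -> Either String ()
checkCrossRef ifaces x expected =
  case [ t | i <- ifaces, Just t <- [lookupExported i x] ] of
    (t:_) | t == expected -> Right ()
          | otherwise     -> Left ("type mismatch for " ++ x)
    []                    -> Left ("unresolved cross-language name " ++ x)

main :: IO ()
main = print (checkCrossRef [langA, langB] "host" TyString)   -- Right ()
```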

    Mophasco (MOnadic framework for PHAsed name resolution using SCOpe graphs)

    Artifact accompanying the paper: Casper Bach Poulsen, Aron Zwaan, Paul Hübner. 2023. "A Monadic Framework for Name Resolution in Multi-Phased Type Checkers". In Proceedings of the 22nd ACM SIGPLAN International Conference on Generative Programming: Concepts and Experiences (GPCE '23), October 22–23, 2023, Cascais, Portugal. https://doi.org/10.1145/3624007.3624051. For more details, see the included README.

    Tooling to Detect Unwanted Thread Exits in Rust

    Technolution is a company that specializes in building embedded and information systems, in which software plays a key role. Recently, Technolution has been transitioning from the use of C in embedded systems to Rust, a relatively new programming language developed by Mozilla. By design, Rust provides the programmer with stronger security and reliability guarantees, such as memory safety, type safety, and the absence of data races. These guarantees are ensured by means of an expressive ownership-based type system. However, it is impossible for the Rust type system to detect all errors statically. Hence, there are still many operations that contain dynamic checks to test for erroneous conditions. When such a check fails, an unrecoverable problem has been encountered and the current thread exits; in Rust this is called a panic. A panic causes the program to terminate, leading to a decrease in the availability of the system. To avoid situations that cause panics, Technolution wants tooling that detects the possible ways a program could panic. For this purpose, we developed a static analysis tool: Rustig. Given a program, Rustig notifies the user of all operations that, either directly or indirectly via another library, may cause a panic. The tool performs the analysis of panic calls in two stages. First, it builds a call graph from the executable of a Rust program, modelling functions as nodes and function calls as directed edges. Second, it performs an analysis on the call graph to determine which functions could cause a panic. As part of the development of Rustig, we devised two new approaches. We developed an approach to constructing call graphs that takes dynamic dispatch calls into account, based on the assumption that once a function address is loaded, it will also be called during execution. Furthermore, in order to analyze the call graph efficiently, a simplification of the all-paths problem is proposed. In contrast to the all-paths problem, the simplification is solvable in polynomial time. The approach involves finding the shortest path for every crossing edge of a graph cut.
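
    The two-stage analysis lends itself to a small worked example. Below is a sketch of the second stage in Haskell (chosen for consistency with the other sketches in this listing; it is not Rustig's Rust code, and the call graph and names are made up): given a call graph, find every function from which a panicking function is reachable by searching backwards from the panic nodes.

```haskell
-- Sketch of the core reachability analysis (not Rustig's implementation):
-- walk backwards from the panic nodes over reversed call-graph edges to find
-- every function that may, directly or transitively, cause a panic.
import qualified Data.Map as Map
import Data.Map (Map)
import qualified Data.Set as Set
import Data.Set (Set)

type Fn = String
type CallGraph = Map Fn [Fn]          -- caller -> callees

-- Reverse the edges so we can walk from panics back to their callers.
callersOf :: CallGraph -> Map Fn [Fn]
callersOf g = Map.fromListWith (++)
  [ (callee, [caller]) | (caller, callees) <- Map.toList g, callee <- callees ]

-- All functions from which a panic node is reachable (the panic nodes included).
mayPanic :: CallGraph -> [Fn] -> Set Fn
mayPanic g panics = go (Set.fromList panics) panics
  where
    rev = callersOf g
    go seen []     = seen
    go seen (f:fs) =
      let new = [ c | c <- Map.findWithDefault [] f rev, c `Set.notMember` seen ]
      in  go (foldr Set.insert seen new) (new ++ fs)

-- Example: main calls parse, parse calls unwrap, and unwrap may panic.
exampleGraph :: CallGraph
exampleGraph = Map.fromList
  [ ("main", ["parse"]), ("parse", ["unwrap"]), ("unwrap", ["panic"]) ]

main :: IO ()
main = print (mayPanic exampleGraph ["panic"])
-- fromList ["main","panic","parse","unwrap"]
```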

    Codebase Underlying the BSc Thesis: Type-Checking Modules and Imports using Scope Graphs

    The objective of this research was to determine whether a stratified type-checking approach using scope graphs could type-check the proof-of-concept language LM. Scope graphs provide a way to type-check real-world programming languages and their constructs. Previous implementations that type-check LM, a language with relative, unordered, and glob imports, do not halt. This dataset contains Haskell code for constructing and type-checking a scope graph of an LM program. It is based on the Phased Haskell library. With this implementation, all test cases halt and the majority exhibit the correct behaviour, with only one false negative. This implementation uses a five-step approach:
    1. Constructing a module hierarchy.
    2. Constructing a scope graph consisting of scopes and module sinks.
    3. Iteratively resolving imports and placing import edges in the scope graph.
    4. Adding all declarations of all modules to the scope graph.
    5. Type-checking the bodies of all declarations with respect to the scope graph.
    Many test cases bundled with this dataset are based on those for LMR (which is very similar to LM) and can be found here. On top of that, more test cases were derived and included. All test cases are denoted as annotated terms. This dataset is linked to a Bachelor's thesis completed at the EEMCS faculty of TU Delft. A link will be added after publication.
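
    The dataset's real code builds on the Phased Haskell library; purely as an illustration of the stratification (placeholder types, no real logic, all names invented), the skeleton below shows how the five phases can be sequenced so that each phase only starts once the previous one has produced its complete result.

```haskell
-- Skeleton of the five-phase, stratified pipeline (illustrative placeholders
-- only; the actual dataset implements these phases with the Phased library).
data ModuleTree   = ModuleTree   deriving Show   -- placeholder result of phase 1
data ScopeGraph   = ScopeGraph   deriving Show   -- placeholder scope graph
data TypedProgram = TypedProgram deriving Show   -- placeholder result of phase 5
type Source = String

buildModuleHierarchy :: Source -> ModuleTree       -- phase 1
buildModuleHierarchy _ = ModuleTree

buildScopesAndSinks :: ModuleTree -> ScopeGraph    -- phase 2
buildScopesAndSinks _ = ScopeGraph

resolveImports :: ScopeGraph -> ScopeGraph         -- phase 3 (iterated to a fixpoint in the real code)
resolveImports g = g

addDeclarations :: ScopeGraph -> ScopeGraph        -- phase 4
addDeclarations g = g

checkBodies :: ScopeGraph -> TypedProgram          -- phase 5
checkBodies _ = TypedProgram

-- Each phase runs to completion before the next begins, so type checking of
-- declaration bodies never races ahead of unresolved imports.
typeCheckLM :: Source -> TypedProgram
typeCheckLM =
  checkBodies . addDeclarations . resolveImports . buildScopesAndSinks . buildModuleHierarchy

main :: IO ()
main = print (typeCheckLM "module example")
```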

    Temperature as a ground for social proximity

    Literature on interpersonal relations has described the sense of intimacy towards others in terms of physical closeness and warmth. Research suggests that these descriptions should be taken literally. Past work (IJzerman & Semin, 2009) revealed that temperature alterations affect the construal of social relations. Lakoff and Johnson (1999) suggest that such findings are unidirectional. However, recent research indicates that the recollection of social exclusion induces perceptions of lower temperature (Zhong & Leonardelli, 2008). In this work, we elaborate on these findings to provide new insights into processes central to interpersonal relations. In the current report, we hypothesized and found that a) actual physically induced experiences of proximity induce perceptions of higher temperature. Moreover, we show that b) verbally induced social proximity has the same effect. © 2010 Elsevier Inc.